On the Information Dimension of Multivariate Gaussian Processes

Authors

  • Bernhard C. Geiger
  • Tobias Koch
Abstract

The authors have recently defined the Rényi information dimension rate d({X_t}) of a stationary stochastic process {X_t, t ∈ Z} as the entropy rate of the uniformly-quantized process divided by minus the logarithm of the quantizer step size 1/m in the limit as m → ∞ (B. Geiger and T. Koch, “On the information dimension rate of stochastic processes,” in Proc. IEEE Int. Symp. Inf. Theory (ISIT), Aachen, Germany, June 2017). For Gaussian processes with a given spectral distribution function F_X, they showed that the information dimension rate equals the Lebesgue measure of the set of harmonics where the derivative of F_X is positive. This paper extends this result to multivariate Gaussian processes with a given matrix-valued spectral distribution function F_X. It is demonstrated that the information dimension rate equals the average rank of the derivative of F_X. As a side result, it is shown that the scale and translation invariance of information dimension carries over from random variables to stochastic processes.
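
In symbols, the two quantities described above can be sketched as follows. This is only a reading aid reconstructed from the abstract's wording: H' denotes the entropy rate, the normalization of frequencies to the interval (-1/2, 1/2] is an assumption, and the paper's regularity conditions are omitted.

% Information dimension rate: entropy rate of the process quantized with
% step size 1/m, divided by -log(1/m) = log m, in the limit m -> infinity.
d(\{X_t\}) = \lim_{m \to \infty} \frac{H'\bigl(\{\lfloor m X_t \rfloor / m\}\bigr)}{\log m}

% Stated result for a multivariate Gaussian process with matrix-valued
% spectral distribution function F_X ("average rank of the derivative"),
% assuming the frequency band is normalized to unit length:
d(\{X_t\}) = \int_{-1/2}^{1/2} \operatorname{rank}\bigl(F_X'(\theta)\bigr) \, d\theta

In the scalar case the rank is 1 wherever F_X'(θ) > 0 and 0 elsewhere, so the second expression reduces to the Lebesgue measure of the set of harmonics with positive derivative, consistent with the earlier univariate result quoted above.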

Related Articles

Complete convergence of moving-average processes under negative dependence sub-Gaussian assumptions

Complete convergence is investigated for moving-average processes of a doubly infinite sequence of negatively dependent sub-Gaussian random variables with zero means, finite variances, and absolutely summable coefficients. As a corollary, the rate of complete convergence is obtained under suitable conditions on the coefficients.


On the Information Dimension of Stochastic Processes

In 1959, Rényi proposed the information dimension and the d-dimensional entropy to measure the information content of general random variables. This paper proposes a generalization of information dimension to stationary stochastic processes by defining the information dimension rate as the entropy rate of the uniformly-quantized stochastic process divided by minus the logarithm of the quantizer...


The Rate of Entropy for Gaussian Processes

In this paper, we show that the Tsallis entropy rate for stochastic processes can be obtained as the limit of conditional entropies, as was done for the Shannon and Rényi entropy rates. Using this, we obtain the Tsallis entropy rate for stationary Gaussian processes. Finally, we derive the relation between the Rényi, Shannon, and Tsallis entropy rates for stationary Gaussian proc...


Anisotropic Function Estimation Using Multi-bandwidth Gaussian Processes.

In nonparametric regression problems involving multiple predictors, there is typically interest in estimating an anisotropic multivariate regression surface in the important predictors while discarding the unimportant ones. Our focus is on defining a Bayesian procedure that leads to the minimax optimal rate of posterior contraction (up to a log factor) adapting to the unknown dimension and anis...


Analysis of Resting-State fMRI Topological Graph Theory Properties in Methamphetamine Drug Users Applying Box-Counting Fractal Dimension

Introduction: Graph theoretical analysis of functional Magnetic Resonance Imaging (fMRI) data has provided new measures for mapping the human brain in vivo. Of all methods for measuring functional connectivity between regions, Linear Correlation (LC) of regional activity time series is considered the most ubiquitous linear measure. The strength of the dependence obl...



Journal: CoRR

Volume: abs/1712.07863

Publication year: 2017